This article explains the concept of AI chips, how they work, and why companies like Anthropic are exploring custom chip design as a strategic move to optimize performance and control their AI infrastructure.
Google and Intel have deepened their AI infrastructure partnership to co-develop custom chips amid global CPU shortages and rising demand for AI processing power.
Elon Musk's Terafab chip venture with Intel raises questions about the partnership's true scope and whether it can succeed in the competitive AI hardware market.
Intel has joined Elon Musk’s Terafab as a key foundry partner in a $25 billion AI chip megaproject, marking a significant step in Intel's recovery strategy.
Intel has joined Elon Musk's Terafab AI chip project in Austin, Texas, to help design and build a massive facility that will supply chips to SpaceX and Tesla.
Learn what AI chips are, how they work, and why companies like Uber and Amazon are investing heavily in them to gain competitive advantages in the AI race.
Nvidia's $2 billion investment in Marvell is more than a financial stake—it's a strategic move to secure long-term revenue through ecosystem lock-in.
AI chip startup Rebellions raises $400 million at $2.3B valuation, positioning itself as a challenger to Nvidia's dominance in AI inference hardware.
South Korean AI chipmaker Rebellions has raised $400 million in a pre-IPO round, valuing the company at $2.34 billion. The funding, led by Samsung, SK Hynix, and Aramco, signals strong investor confidence in its AI inference chip technology.
Arm has broken from its licensing-only model by manufacturing its first in-house chip, designed specifically for AI data centers. This marks a major strategic shift for the company, signaling its intent to compete directly in the high-growth AI chip market.
Arm has entered the AI chip market with new hardware designed to accelerate machine learning workloads, securing early customers including Meta, OpenAI, Cerebras, and Cloudflare.
Learn what AI inference chips are, how they work, and why they're crucial for making AI systems faster and more efficient. This explainer breaks down the basics of inference chips using simple analogies.